Exploiting kernel-based feature weighting and instance clustering to transfer knowledge across domains
Authors
Abstract
Learning invariant features across domains is of vital importance to unsupervised domain adaptation, where classifiers trained on labeled training examples (source domain) must adapt to a different set of test examples (target domain) for which no labels are available. In this paper, we propose a novel approach that finds invariant features in the original space and transfers knowledge across domains. We extract invariant features of the input data with a kernel-based feature weighting approach that exploits the distribution difference between domains and instance clustering to identify the desired features. The proposed method, called kernel-based feature weighting (KFW), uses the maximum mean discrepancy to measure the difference between domains. KFW then forms condensed clusters in the reduced domains, i.e., the domains stripped of variant features, to enhance classification performance. The simultaneous use of feature weighting and instance clustering improves both adaptation and classification performance. Our approach automatically discovers the invariant features across domains and employs them to bridge the source and target domains. We demonstrate the effectiveness of our approach on both artificial and real-world datasets. Empirical results show that the proposed method outperforms state-of-the-art methods on standard transfer learning benchmark datasets.
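As a concrete illustration of the kernel-based weighting idea described in the abstract, the following Python sketch scores each feature by the maximum mean discrepancy (MMD) between source and target samples under an RBF kernel and down-weights features whose marginal distributions differ strongly across domains. This is a minimal sketch of the general MMD-based weighting principle, not the authors' exact KFW algorithm; the per-feature scoring, the weighting formula, and the bandwidth choice are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, gamma):
    """Pairwise RBF kernel matrix between 1-D samples a and b."""
    diff = a[:, None] - b[None, :]
    return np.exp(-gamma * diff ** 2)

def mmd2(xs, xt, gamma=1.0):
    """(Biased) squared maximum mean discrepancy between two 1-D samples."""
    k_ss = rbf_kernel(xs, xs, gamma).mean()
    k_tt = rbf_kernel(xt, xt, gamma).mean()
    k_st = rbf_kernel(xs, xt, gamma).mean()
    return k_ss + k_tt - 2.0 * k_st

def feature_weights(Xs, Xt, gamma=1.0):
    """Weight each feature inversely to its source/target discrepancy.
    Features with large cross-domain MMD (variant features) get low weight;
    invariant features get high weight. Illustrative scheme only, not the
    paper's exact formulation."""
    scores = np.array([mmd2(Xs[:, j], Xt[:, j], gamma)
                       for j in range(Xs.shape[1])])
    return 1.0 / (1.0 + scores)

# Toy usage: feature 0 is shifted between domains, feature 1 is invariant.
rng = np.random.default_rng(0)
Xs = np.column_stack([rng.normal(0.0, 1.0, 200), rng.normal(0.0, 1.0, 200)])
Xt = np.column_stack([rng.normal(3.0, 1.0, 200), rng.normal(0.0, 1.0, 200)])
print(feature_weights(Xs, Xt))  # the first weight should be much smaller
```

In a full pipeline of this kind, the reduced (re-weighted) representation would then be clustered and used to train the classifier on the source domain before applying it to the target domain.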
Similar resources
Image Classification via Sparse Representation and Subspace Alignment
Image representation is a crucial problem in image processing, where there exist many low-level representations of an image, e.g., SIFT, HOG, and so on. But there is a missing link between low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation, and principal component analysis, are employed to d...
Self-Adapted Multi-Task Clustering
Multi-task clustering improves the clustering performance of each task by transferring knowledge across related tasks. Most existing multi-task clustering methods are based on the ideal assumption that the tasks are completely related. However, in many real applications, the tasks are usually only partially related, and brute-force transfer may cause a negative effect which degrades the clustering per...
Sample-oriented Domain Adaptation for Image Classification
Image processing is a method to perform operations on an image in order to obtain an enhanced image or to extract useful information from it. Conventional image processing algorithms cannot perform well in scenarios where the training images (source domain) used to learn the model have a different distribution from the test images (target domain). Also, many real world applicat...
Using a Relevance Model for performing Feature Weighting
Feature Weighting is one of the most difficult tasks when developing Case Based Reasoning applications. This complexity grows when dealing with ill-defined wide domains with a sparse case base. Moreover, most widely-used feature selection and feature weighting methods assume that features are either relevant in the whole instance space or irrelevant throughout. However, it is often the case th...
Instance-Based Learning Techniques of Unsupervised Feature Weighting Do not Perform So Badly!
The major hypothesis that we will prove in this paper is that unsupervised learning techniques of feature weighting are not significantly worse than supervised methods, as is commonly believed in the machine learning community. This paper tests the powe...
Journal:
Volume, Issue:
Pages: -
Publication date: 2017